Neural Network Renormalization Group

Authors

  • Shuo-Hui Li
  • Lei Wang
Abstract

We present a variational renormalization group approach using a deep generative model composed of bijectors. The model can learn hierarchical transformations between physical variables and renormalized collective variables. It can directly generate statistically independent physical configurations by iterative refinement at various length scales. The generative model has an exact and tractable likelihood, which provides the renormalized energy function of the collective variables and supports unbiased rejection sampling of the physical variables. To train the neural network, we employ probability density distillation, in which the training loss is a variational upper bound of the physical free energy. The approach could be useful for automatically identifying collective variables and effective field theories.
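As a rough illustration of the two ingredients named above, the sketch below builds a single RealNVP-style affine-coupling bijector with an exact log-likelihood and evaluates the probability-density-distillation loss, i.e. the variational bound E_q[log q(x) + E(x)] >= -log Z. The quartic toy energy, the layer sizes, and the untrained random weights are illustrative assumptions of this sketch, not the authors' architecture.

    import numpy as np

    rng = np.random.default_rng(0)
    D = 4                               # number of physical variables (toy size)

    # Hypothetical toy energy function of the physical variables x; in the paper
    # this would be the energy of a physical model such as a lattice field theory.
    def energy(x):
        return 0.5 * np.sum(x**2, axis=-1) + 0.1 * np.sum(x**4, axis=-1)

    # One RealNVP-style affine-coupling bijector: an invertible map z -> x whose
    # Jacobian log-determinant is cheap, so the model likelihood stays exact.
    class AffineCoupling:
        def __init__(self, dim):
            half = dim // 2
            self.w_s = 0.01 * rng.standard_normal((half, half))
            self.w_t = 0.01 * rng.standard_normal((half, half))

        def forward(self, z):
            half = z.shape[1] // 2
            z1, z2 = z[:, :half], z[:, half:]
            s = np.tanh(z1 @ self.w_s)          # log-scale
            t = z1 @ self.w_t                   # shift
            x2 = z2 * np.exp(s) + t
            logdet = np.sum(s, axis=-1)         # log|det dx/dz|
            return np.concatenate([z1, x2], axis=1), logdet

    # Push latent (renormalized) variables z through the bijector and evaluate the
    # exact model log-likelihood  log q(x) = log p(z) - log|det dx/dz|.
    def sample_and_logq(bijector, n):
        z = rng.standard_normal((n, D))
        logp_z = -0.5 * np.sum(z**2, axis=-1) - 0.5 * D * np.log(2.0 * np.pi)
        x, logdet = bijector.forward(z)
        return x, logp_z - logdet

    # Probability-density-distillation loss: a Monte Carlo estimate of the
    # variational bound  E_q[log q(x) + E(x)] >= -log Z  (units with beta = 1).
    bijector = AffineCoupling(D)
    x, logq = sample_and_logq(bijector, n=1024)
    loss = np.mean(logq + energy(x))
    print("variational free-energy estimate:", float(loss))

In the paper's setting, many such bijectors would be stacked hierarchically across length scales and their weights optimized by descending this loss; the sketch above only evaluates the bound once with untrained weights.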


Similar Articles

Can Renormalization Group Flow End in a Big Mess?

The field-theoretical renormalization group equations share many features with the equations of dynamical systems. In particular, the way the Callan-Symanzik equation ensures the independence of a theory from its subtraction point is reminiscent of self-similarity in autonomous flows towards attractors. Motivated by such analogies, we propose that besides isolated fixed points, the coup...


Time-Dependent Real-Space Renormalization Group Method

In this paper, using the tight-binding model, we extend the real-space renormalization group method to time-dependent Hamiltonians. We derive the time-dependent recursion relations for the renormalized tight-binding Hamiltonian by iteratively decimating selected sites of the lattice. The formalism is then used to calculate the local density of electronic states for a one-dimensional quant...
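The decimation idea behind these recursion relations can be illustrated with the standard static (time-independent) version of the method, which the cited paper generalizes. The sketch below is a hedged illustration, not the paper's formalism: it decimates every other site of a uniform 1-D tight-binding chain and reads off the bulk local density of states from the renormalized on-site Green's function; the parameters (eps = 0, t = 1, broadening eta) and the function name bulk_ldos are arbitrary choices of this sketch.

    import numpy as np

    # Static real-space decimation for a uniform 1-D tight-binding chain
    #   H = sum_i [ eps |i><i| + t (|i><i+1| + h.c.) ].
    # Decimating every other site renormalizes the on-site energy and hopping,
    #   eps' = eps + 2 t^2 / (z - eps),   t' = t^2 / (z - eps),
    # and iterating drives the hopping to zero, after which the bulk local
    # Green's function G(z) = 1 / (z - eps_infinity) gives the local DOS.
    def bulk_ldos(E, eps=0.0, t=1.0, eta=1e-2, n_iter=200):
        z = E + 1j * eta                    # energy with a small broadening
        eps_b, hop = eps, t
        for _ in range(n_iter):
            g = 1.0 / (z - eps_b)           # locator of a decimated site
            eps_b, hop = eps_b + 2.0 * hop**2 * g, hop**2 * g
        G = 1.0 / (z - eps_b)
        return -G.imag / np.pi              # local density of states

    # Compare with the exact result 1 / (pi * sqrt(4 t^2 - E^2)) inside the band.
    for E in np.linspace(-2.4, 2.4, 9):
        exact = 1.0 / (np.pi * np.sqrt(4.0 - E**2)) if abs(E) < 2.0 else 0.0
        print(f"E = {E:+.2f}   decimation LDOS = {bulk_ldos(E):.4f}   exact = {exact:.4f}")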


Self-organized criticality in a network of interacting neurons

This paper analyzes a simple neural network that exhibits self-organized criticality. Such criticality arises from combining a simple neural network that has an excitatory feedback loop generating bistability with an anti-Hebbian synapse in its input pathway. Using the methods of statistical field theory, we show how one can formulate the stochastic dyna...


Prediction of the Liquid Vapor Pressure Using the Artificial Neural Network-Group Contribution Method

In this paper, the vapor pressure of pure compounds is estimated using an artificial neural network combined with a simple group contribution method (ANN-GCM). To keep the model comprehensive, materials were chosen from various families; most belong to 12 chemical families. Vapor pressure data for 100 compounds are used to train, validate, and test the ANN-GCM model. Va...
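As a hedged, minimal illustration of the ANN-GCM input representation (and not the paper's descriptors, data, or network), the sketch below feeds synthetic functional-group counts plus temperature into a small scikit-learn MLP regressor; the Antoine-like toy target merely stands in for the 100-compound vapor pressure data set, which is not reproduced here.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Hypothetical descriptors: counts of a few functional groups per compound plus
    # temperature; the target is a log vapor pressure. Purely synthetic stand-in data.
    n_compounds, n_groups = 100, 6
    groups = rng.integers(0, 4, size=(n_compounds, n_groups)).astype(float)
    temperature = rng.uniform(280.0, 400.0, size=(n_compounds, 1))
    X = np.hstack([groups, temperature])

    # Toy ground truth: an Antoine-like temperature dependence plus additive
    # group contributions and a little noise.
    contrib = rng.normal(0.0, 0.5, size=n_groups)
    y = 8.0 - 1500.0 / temperature[:, 0] + groups @ contrib + rng.normal(0.0, 0.05, n_compounds)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0))
    model.fit(X_train, y_train)
    print("held-out R^2:", round(model.score(X_test, y_test), 3))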


Convolutional Neural Networks Arise From Ising Models and Restricted Boltzmann Machines

Convolutional neural net-like structures arise from training an unstructured deep belief network (DBN) using structured simulation data of 2-D Ising Models at criticality. The convolutional structure arises not just because such a structure is optimal for the task, but also because the belief network automatically engages in block renormalization procedures to “rescale” or “encode” the input, a...
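As a hedged illustration of the setup this abstract alludes to (and not the paper's DBN experiments), the sketch below trains a single restricted Boltzmann machine, the building block of a deep belief network, with one-step contrastive divergence on small Metropolis-sampled 2-D Ising configurations near criticality; the lattice size, hidden-unit count, and learning rate are arbitrary choices of this sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    # --- Toy 2-D Ising configurations near criticality (Metropolis sampling) ---
    L_lat, T = 8, 2.269                      # small lattice; T near the 2-D critical point
    def ising_samples(n_samples, thin=10):
        s = rng.choice([-1, 1], size=(L_lat, L_lat))
        out = []
        for sweep in range(n_samples * thin):
            for _ in range(L_lat * L_lat):
                i, j = rng.integers(L_lat), rng.integers(L_lat)
                nb = s[(i + 1) % L_lat, j] + s[(i - 1) % L_lat, j] \
                   + s[i, (j + 1) % L_lat] + s[i, (j - 1) % L_lat]
                dE = 2 * s[i, j] * nb
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    s[i, j] *= -1
            if (sweep + 1) % thin == 0:
                out.append((s.flatten() + 1) // 2)   # spins {-1,+1} -> binary units {0,1}
        return np.array(out, dtype=float)

    # --- A single RBM trained with one-step contrastive divergence (CD-1) ---
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    n_vis, n_hid, lr = L_lat * L_lat, 16, 0.05
    W = 0.01 * rng.standard_normal((n_vis, n_hid))
    b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)

    data = ising_samples(200)
    for epoch in range(20):
        for v0 in data:
            ph0 = sigmoid(v0 @ W + b_h)                        # hidden probabilities given data
            h0 = (rng.random(n_hid) < ph0).astype(float)
            pv1 = sigmoid(h0 @ W.T + b_v)                      # one-step reconstruction
            v1 = (rng.random(n_vis) < pv1).astype(float)
            ph1 = sigmoid(v1 @ W + b_h)
            W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))  # CD-1 weight update
            b_v += lr * (v0 - v1)
            b_h += lr * (ph0 - ph1)

    recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
    print("mean reconstruction error:", round(float(np.mean((data - recon) ** 2)), 4))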



Journal:
  • CoRR

Volume: abs/1802.02840

Pages: -

Publication year: 2018